An Adaptive Accelerated Proximal Gradient Method and its Homotopy Continuation for Sparse Optimization
Abstract
We first propose an adaptive accelerated proximal gradient (APG) method for minimizing strongly convex composite functions with unknown convexity parameters. This method incorporates a restarting scheme to automatically estimate the strong convexity parameter and achieves a nearly optimal iteration complexity. Then we consider the ℓ1-regularized least-squares (ℓ1-LS) problem in the high-dimensional setting. Although such an objective function is not strongly convex, it has restricted strong convexity over sparse vectors. We exploit this property by combining the adaptive APG method with a homotopy continuation scheme, which generates a sparse solution path towards optimality. This method obtains a global linear rate of convergence, and its overall iteration complexity has a weaker dependency on the restricted condition number than previous work.
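The restarting idea in the first part can be made concrete with a short sketch. The Python code below is illustrative only: the gradient-mapping-based restart test, the halving factor, and all names are assumptions standing in for the paper's actual estimation scheme, which is not reproduced here.

    import numpy as np

    def grad_map_norm(x, grad_f, prox, L):
        # Norm of the proximal gradient mapping; it vanishes exactly at a minimizer.
        return L * np.linalg.norm(x - prox(x - grad_f(x) / L, 1.0 / L))

    def adaptive_apg(grad_f, prox, x0, L, mu=1.0, tol=1e-8, max_iter=5000):
        # Illustrative sketch: momentum is built from a running estimate `mu` of
        # the unknown strong convexity parameter.  If the gradient-mapping norm
        # fails to contract at the linear rate that `mu` predicts, `mu` is halved
        # and the method restarts (a simplified stand-in for the paper's test).
        x, x_prev = x0.copy(), x0.copy()
        g0, k = grad_map_norm(x, grad_f, prox, L), 0
        for _ in range(max_iter):
            q = np.sqrt(mu / L)
            beta = (1.0 - q) / (1.0 + q)          # Nesterov momentum for known mu
            y = x + beta * (x - x_prev)
            x_prev, x = x, prox(y - grad_f(y) / L, 1.0 / L)
            k += 1
            g = grad_map_norm(x, grad_f, prox, L)
            if g <= tol:
                break
            if g > 2.0 * g0 * (1.0 - q) ** k:     # slower than `mu` promised
                mu *= 0.5                         # estimate was too optimistic
                x_prev, g0, k = x.copy(), g, 0
        return x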
Similar Papers
An Adaptive Accelerated Proximal Gradient Method and its Homotopy Continuation for Sparse Optimization
We consider optimization problems with an objective function that is the sum of two convex terms: one is smooth and given by a black-box oracle, and the other is general but with a simple, known structure. We first present an accelerated proximal gradient (APG) method for problems where the smooth part of the objective function is also strongly convex. This method incorporates an efficient line...
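The truncated sentence above refers to a line-search strategy. A generic backtracking test for the proximal gradient step, shown here as a sketch rather than the paper's specific scheme, looks as follows:

    import numpy as np

    def backtracking_prox_step(f, grad_f, prox, y, L, eta=2.0):
        # Generic backtracking: grow the local Lipschitz estimate L until the
        # quadratic upper bound on the smooth part f holds at the trial point.
        g = grad_f(y)
        while True:
            x = prox(y - g / L, 1.0 / L)
            d = x - y
            if f(x) <= f(y) + g @ d + 0.5 * L * (d @ d):
                return x, L                   # accept; L can be reused next step
            L *= eta                          # bound violated: increase L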
A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem
We consider solving the ℓ1-regularized least-squares (ℓ1-LS) problem in the context of sparse recovery for applications such as compressed sensing. The standard proximal gradient method, also known as iterative soft-thresholding when applied to this problem, has low computational cost per iteration but a rather slow convergence rate. Nevertheless, when the solution is sparse, it often exhibits f...
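For reference, the iterative soft-thresholding iteration mentioned above takes only a few lines; this is the standard textbook form, with the step size set from the spectral norm of A:

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1: shrink each entry toward zero by t.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def ista(A, b, lam, n_iter=500):
        # Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
        # with constant step 1/L, where L = ||A||_2^2 bounds the Lipschitz
        # constant of the smooth part.
        L = np.linalg.norm(A, 2) ** 2
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            x = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
        return x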
A Proximal-Gradient Homotopy Method for the L1-Regularized Least-Squares Problem
We consider the ℓ1-regularized least-squares problem for sparse recovery and compressed sensing. Since the objective function is not strongly convex, standard proximal gradient methods only achieve sublinear convergence. We propose a homotopy continuation strategy, which employs a proximal gradient method to solve the problem with a sequence of decreasing regularization parameters. It is shown ...
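A schematic version of such a continuation loop is given below. The geometric factor eta and the inner solver interface are illustrative assumptions, not the paper's specification; the starting value ||A^T b||_inf is the smallest regularization weight for which the all-zeros vector is already optimal.

    import numpy as np

    def homotopy_l1_ls(A, b, lam_target, solver, eta=0.7):
        # Schematic continuation: shrink lam geometrically from lam_max down to
        # lam_target, warm-starting each subproblem at the previous solution so
        # the iterates stay sparse along the path.
        lam = np.linalg.norm(A.T @ b, np.inf)   # x = 0 is optimal for lam >= this
        x = np.zeros(A.shape[1])
        while lam > lam_target:
            lam = max(lam * eta, lam_target)
            x = solver(A, b, lam, x0=x)         # any inner prox-gradient method
        return x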
Accelerated Block Coordinate Proximal Gradients with Applications in High Dimensional Statistics
Nonconvex optimization problems arise in many research fields and have attracted considerable attention in signal processing, statistics, and machine learning. In this work, we explore the accelerated proximal gradient method and some of its variants, which have recently been shown to converge in nonconvex settings. We show that a novel variant proposed here, which exploits adaptive momentum and block ...
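The truncated description mentions block updates. A minimal randomized block proximal gradient step, a generic sketch rather than the accelerated, adaptive-momentum variant the paper proposes, has the following structure:

    import numpy as np

    def block_prox_grad(grad_f, prox_blocks, x, blocks, L_blocks, n_iter=1000, seed=0):
        # Generic randomized block proximal gradient: update one coordinate
        # block per iteration with that block's own Lipschitz constant.
        rng = np.random.default_rng(seed)
        for _ in range(n_iter):
            i = rng.integers(len(blocks))    # pick one block uniformly at random
            idx = blocks[i]
            g = grad_f(x)[idx]               # a real code would form only this partial gradient
            x[idx] = prox_blocks[i](x[idx] - g / L_blocks[i], 1.0 / L_blocks[i])
        return x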
Accelerated Sparse Linear Regression via Random Projection
In this paper, we present an accelerated numerical method based on random projection for sparse linear regression. Previous studies have shown that under appropriate conditions, gradient-based methods enjoy a geometric convergence rate when applied to this problem. However, the time complexity of evaluating the gradient is as large as O(nd), where n is the number of data points and d is the dim...
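The random projection idea can be sketched as follows; a Gaussian sketch is one common instance of the technique, and the construction analyzed in the cited paper may differ.

    import numpy as np

    def gaussian_sketch(A, b, m, seed=0):
        # Compress the n x d design matrix with an m x n Gaussian projection
        # (m << n), so each gradient of the sketched least-squares objective
        # costs O(md) instead of O(nd).
        S = np.random.default_rng(seed).normal(size=(m, A.shape[0])) / np.sqrt(m)
        return S @ A, S @ b

The sketched pair (SA, Sb) can then be handed to any of the solvers sketched above.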